Web Survey Bibliography
Since university employees can generally be assumed to be familiar with computers, an online survey of all university lecturers should be a good alternative to the traditional paper-and-pencil form, particularly in terms of cost. However, the effect of this survey mode on the quality of the resulting sample and of the resulting answers was not yet clear. The following results from an employee survey at the University of Bremen address these questions.
Between November 2002 and February 2003, all university lecturers were invited to take part in a survey on the current job situation and job satisfaction at the university. Assigned at random, half of the respondents received a traditional paper questionnaire, while the other half were asked to complete an online questionnaire. Both questionnaires were kept identical in order to estimate the actual effect of the survey mode. As a consequence, the questionnaire was designed for the traditional mode but not optimised for the online one.
The results were analysed in two respects: with regard to the sample and with regard to the content. The overall response rate of about 47% already revealed mode-specific differences: the response rate of the traditional questionnaire was about 50%, whereas only 43.4% of the online questionnaires were returned. Both rates are acceptable, considering that participation was not binding and rested solely on the research interest of a single university employee.
Analysing the two samples by sex and position of the respondents showed that the response rate among women was higher than among men in both modes, and that there was a mode effect for position: in the paper mode, professors had a markedly higher response rate than other lecturers, while in the online mode this finding was reversed. This effect may also be interpreted as an age effect.
The content analysis was intended to answer three main questions: Are there mode effects on the response distributions of the questions? Are there effects on the extent of item non-response? Are there specific effects on the answers to open-ended questions?
The main result is that both survey modes delivered data of equivalent quality; nevertheless, some particularities must be kept in mind.